
# Japanese Instruction Optimization

**ABEJA Qwen2.5 7b Japanese V0.1** (Apache-2.0, published by abeja)
A Japanese-language model built on Qwen/Qwen2.5-7B-Instruct and further trained with distillation to improve instruction-following performance.
Tags: Large Language Model, Transformers, Japanese
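Since this model derives from Qwen2.5-Instruct, it presumably expects the ChatML-style prompt format used by the Qwen instruct family (in practice a tokenizer's chat template handles this; the sketch below just makes the format visible). The helper and the Japanese example messages are illustrative, not taken from the model card.

```python
def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts into a ChatML-style string,
    as used by Qwen-family instruct models (format assumed from the base
    model, not this fine-tune's card)."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # A trailing assistant header cues the model to generate its reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "あなたは役に立つアシスタントです。"},
    {"role": "user", "content": "日本の首都はどこですか？"},
])
print(prompt)
```

In real use you would pass the message list to the tokenizer's `apply_chat_template` rather than formatting by hand; this sketch only shows what that template produces.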
**Swallow 70B Instruct GGUF** (published by TheBloke)
A GGUF-format release of the Swallow 70B Instruct language model, compatible with multiple clients and libraries and suited to text generation across a range of scenarios.
Tags: Large Language Model, Transformers, Multilingual
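GGUF files carry a fixed magic number in their first four bytes, so a downloaded file can be sanity-checked before handing it to a runtime such as llama.cpp. A minimal sketch (the file path in the example is hypothetical):

```python
GGUF_MAGIC = b"GGUF"  # 4-byte magic at the start of every GGUF file

def is_gguf(path):
    """Return True if the file at `path` starts with the GGUF magic bytes."""
    with open(path, "rb") as f:
        return f.read(4) == GGUF_MAGIC

# Hypothetical usage with a locally downloaded quantized file:
# if is_gguf("swallow-70b-instruct.Q4_K_M.gguf"):
#     ...  # safe to pass to a GGUF-aware client such as llama.cpp
```

This only checks the container format, not the model architecture or quantization type, which live in the GGUF metadata that follows the header.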
**Japanese Gpt Neox 3.6b Instruction Ppo** (MIT, published by rinna)
A 3.6-billion-parameter Japanese GPT-NeoX model fine-tuned with reinforcement learning from human feedback (RLHF), enabling it to follow instructions more reliably in conversation.
Tags: Large Language Model, Transformers, Multilingual
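rinna's instruction-tuned models document a speaker-prefixed conversation format in which turns are joined by the literal token `<NL>` instead of raw newlines. The helper below sketches that format; the exact template is an assumption from the rinna model cards and should be verified before use.

```python
def build_rinna_prompt(turns):
    """Render (speaker, text) turns into rinna's documented prompt format:
    speaker-prefixed turns joined by the literal token "<NL>", ending with
    an empty system turn for the model to complete. (Format assumed from
    the rinna model card; verify against the card before use.)"""
    rendered = []
    for speaker, text in turns:
        # Newlines inside an utterance are replaced by the "<NL>" token.
        rendered.append(f"{speaker}: {text.replace(chr(10), '<NL>')}")
    rendered.append("システム: ")
    return "<NL>".join(rendered)

prompt = build_rinna_prompt([("ユーザー", "日本のおすすめの観光地を教えてください。")])
print(prompt)
```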